How suicide prevention is getting a boost from artificial intelligence: Exclusive
Suicide prevention is getting a boost from artificial intelligence. The Trevor Project, the world's largest suicide prevention and crisis intervention organization for LGBTQ youth, has launched a "Crisis Contact Simulator" to help train counselors and prepare them to support youth in crisis. Developed in collaboration with Google, the first-of-its-kind technology is an AI-powered counselor training tool that simulates digital conversations and allows trainees to practice realistic conversations with youth personas. "Riley," the organization's first Crisis Contact Simulator persona, emulates messages from a teen in North Carolina who feels anxious and depressed. In addition to Riley, the organization is currently developing a variety of personas that represent a wide range of life situations, backgrounds, sexual orientations, gender identities and risk levels.
How The Trevor Project is using AI to help prevent suicide
Suicide disproportionately affects LGBTQ youth. In the U.S. alone, more than 1.8 million LGBTQ youth between the ages of 13 and 24 seriously consider suicide or experience a significant crisis each year. Additionally, LGBTQ youth are more than four times as likely to attempt suicide as their peers, and up to 50 percent of all trans people have made a suicide attempt, most before the age of 25. Black LGBTQ young people, who hold multiple marginalized identities, are even more affected: research shows that Black youth ages 5 to 12 are dying by suicide at roughly twice the rate of their white peers. To support this particularly vulnerable and diverse community, The Trevor Project takes an intersectional approach to crisis intervention and suicide prevention.
AI Analysis Gives Guidance to Crisis Counselors
A study by Cornell University researchers and the crisis-counselor platform Crisis Text Line described how volunteer crisis counselors' use of language evolves. Using state-of-the-art natural language processing, the team found that the language counselors employ changes systematically with their training and their empathy for callers in distress, giving rise to distinctive voices for calming distressed individuals. The researchers analyzed more than 1 million anonymized texts from about 3,500 counselors on the Crisis Text Line. Crisis Text Line's Robert Filbin said the study's insights will help the platform train and guide crisis counselors. Cornell's Cristian Danescu-Niculescu-Mizil said, "This is an example of how natural language processing techniques can assist the development of skills in conversation-heavy professions."
How data scientists are using AI for suicide prevention
Deciding whom to help first can be a life-or-death decision. At Crisis Text Line, a text-messaging-based crisis counseling hotline, surges of incoming messages have the potential to overwhelm the human staff. So data scientists at Crisis Text Line are using machine learning, a type of artificial intelligence, to pull out the words and emojis that can signal a person at higher risk of suicidal ideation or self-harm. The computer tells counselors who on hold needs to jump to the front of the line to be helped. They can do this because Crisis Text Line does something radical for a crisis counseling service: it collects a massive amount of data on the 30 million texts it has exchanged with users.
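To make the idea concrete, here is a minimal sketch of that kind of triage: score each incoming message for risk terms, then serve the highest-scoring texter first. This is purely illustrative; Crisis Text Line's actual models are far more sophisticated, and the terms and weights below are hypothetical, not drawn from their system.

```python
import heapq
import itertools

# Hypothetical risk weights for words and emojis (illustrative only,
# not real model parameters).
RISK_WEIGHTS = {
    "pills": 3.0,
    "goodbye": 2.0,
    "hopeless": 1.5,
    "alone": 1.0,
    "💊": 3.0,
    "😢": 0.5,
}

def risk_score(message: str) -> float:
    """Sum the weights of any risk terms found in the message."""
    text = message.lower()
    return sum(w for term, w in RISK_WEIGHTS.items() if term in text)

class TriageQueue:
    """Serve the highest-risk texter first; ties break by arrival order."""
    def __init__(self):
        self._heap = []
        self._arrival = itertools.count()

    def add(self, texter_id: str, message: str) -> None:
        # heapq is a min-heap, so negate the score for highest-risk-first.
        heapq.heappush(
            self._heap,
            (-risk_score(message), next(self._arrival), texter_id),
        )

    def next_texter(self) -> str:
        return heapq.heappop(self._heap)[2]

queue = TriageQueue()
queue.add("A", "just checking in, feeling alone")
queue.add("B", "i took the pills, goodbye")
print(queue.next_texter())  # "B" jumps to the front of the line
```

A production system would replace the keyword lookup with a trained classifier over the full message history, but the queueing logic, reordering the waitlist by predicted risk rather than arrival time, is the same shape.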